ON HADAMARD DIFFERENTIABILITY AND M-ESTIMATION IN LINEAR MODELS

Author

  • Jian-Jian Ren
Abstract

Robust (M-) estimation in linear models generally involves statistical functional processes. For drawing statistical conclusions in large samples, some (uniform) linear approximations are usually needed for such functionals. In this context, the role of Hadamard differentiability is critically examined in this dissertation. In particular, the concept of second-order Hadamard differentiability and some related theoretical results are established for the study of the convergence rate in probability of the uniform asymptotic linearity of the M-estimator of regression. Thereby, using Hadamard differentiability through the linear approximation of the estimator, the asymptotic normality, the weak consistency and an asymptotic representation are derived under weak conditions on the score function ψ, the underlying d.f. F, and the regression constants. Some other approaches are also considered for the study of the asymptotic normality and the weak consistency, and these approaches lead to two sets of conditions for the strong consistency of the M-estimators in linear models.
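As an illustrative sketch (not from the dissertation itself), an M-estimator of regression with a bounded score function ψ can be computed by iteratively reweighted least squares. The Huber ψ is used here as a concrete choice; the function names (`huber_psi`, `m_estimate`) and the tuning constant `c` are assumptions for this example:

```python
import numpy as np

def huber_psi(u, c=1.345):
    """Huber score function: identity near zero, clipped at +/- c."""
    return np.clip(u, -c, c)

def m_estimate(X, y, c=1.345, tol=1e-8, max_iter=200):
    """M-estimator of regression coefficients via iteratively
    reweighted least squares (IRLS) with the Huber score."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS starting value
    for _ in range(max_iter):
        r = y - X @ beta
        # robust scale estimate: normalized median absolute deviation
        scale = max(np.median(np.abs(r - np.median(r))) / 0.6745, 1e-12)
        u = r / scale
        # IRLS weights w_i = psi(u_i) / u_i (1 at u = 0)
        w = np.where(u == 0.0, 1.0, huber_psi(u, c) / u)
        # weighted normal equations: X' W X beta = X' W y
        beta_new = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
        if np.linalg.norm(beta_new - beta) < tol:
            return beta_new
        beta = beta_new
    return beta
```

Because ψ is bounded, gross outliers in the responses receive small weights, so the fit is far less sensitive to contamination than ordinary least squares.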


Similar articles

Partially finite convex programming, Part II: Explicit lattice models

In Part I of this work we derived a duality theorem for partially finite convex programs, problems for which the standard Slater condition fails almost invariably. Our result depended on a constraint qualification involving the notion of quasi relative interior. The derivation of the primal solution from a dual solution depended on the differentiability of the dual objective function: the diffe...


Locally Lipschitz Functions and Bornological Derivatives

We study the relationships between Gateaux, Weak Hadamard and Fréchet differentiability and their bornologies for Lipschitz and for convex functions. AMS Subject Classification. Primary: 46A17, 46G05, 58C20. Secondary: 46B20.


A Smooth Variational Principle with Applications to Subdifferentiability and to Differentiability of Convex Functions

We show that, typically, lower semicontinuous functions on a Banach space densely inherit lower subderivatives of the same degree of smoothness as the norm. In particular every continuous convex function on a space with a Gâteaux (weak Hadamard, Fréchet) smooth renorm is densely Gâteaux (weak Hadamard, Fréchet) differentiable. Our technique relies on a more powerful analogue of Ekeland's variat...


Numerical solution of fuzzy differential equations under generalized differentiability by fuzzy neural network

In this paper, we interpret a fuzzy differential equation by using the strongly generalized differentiability concept. Utilizing the generalized characterization theorem, a novel hybrid method based on a learning algorithm of fuzzy neural networks for the solution of differential equations with fuzzy initial values is presented. Here the neural network is considered as a part of a large field called ne...


EXTENDED PREDICTOR-CORRECTOR METHODS FOR SOLVING FUZZY DIFFERENTIAL EQUATIONS UNDER GENERALIZED DIFFERENTIABILITY

In this paper, the (m+1)-step Adams-Bashforth, Adams-Moulton, and Predictor-Corrector methods are used to solve first-order linear fuzzy ordinary differential equations. The concepts of fuzzy interpolation and generalized strong differentiability are used to obtain general algorithms. Each of these algorithms has advantages over current methods. Moreover, for each algorithm a convergence formula can b...
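A minimal crisp (non-fuzzy) sketch of the predictor-corrector idea the abstract refers to: a two-step Adams-Bashforth predictor followed by an Adams-Moulton (trapezoidal) corrector for a scalar first-order ODE. The fuzzy-interpolation machinery of the paper is omitted, and the function name `abm2` is illustrative:

```python
import numpy as np

def abm2(f, t0, y0, h, n):
    """Two-step Adams-Bashforth predictor with an Adams-Moulton
    (trapezoidal) corrector for y' = f(t, y), y(t0) = y0."""
    t = t0 + h * np.arange(n + 1)
    y = np.empty(n + 1)
    y[0] = y0
    # bootstrap the first step with Heun's method (second order)
    k1 = f(t[0], y[0])
    y[1] = y[0] + h / 2 * (k1 + f(t[1], y[0] + h * k1))
    for i in range(1, n):
        # predictor (Adams-Bashforth, 2-step)
        yp = y[i] + h / 2 * (3 * f(t[i], y[i]) - f(t[i - 1], y[i - 1]))
        # corrector (Adams-Moulton, trapezoidal rule) using the predicted value
        y[i + 1] = y[i] + h / 2 * (f(t[i], y[i]) + f(t[i + 1], yp))
    return t, y
```

The predictor supplies an explicit estimate of the new value, which the implicit corrector then refines; both stages here are second-order accurate.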




Publication date: 1990